Linear dimensionality reduction by maximizing the Chernoff distance in the transformed space
Authors
Abstract
Linear dimensionality reduction (LDR) techniques are quite important in pattern recognition due to their linear time complexity and simplicity. In this paper, we present a novel LDR technique which, though linear, aims to maximize the Chernoff distance in the transformed space, thus augmenting the class separability in such a space. We present the corresponding criterion, which is maximized via a gradient-based algorithm, and provide convergence and initialization proofs. We have performed a comprehensive performance analysis of our method combined with two well-known classifiers, linear and quadratic, on synthetic and real-life data, and compared it with other LDR techniques. The results on synthetic and standard real-life datasets show that the proposed criterion outperforms the latter when combined with both linear and quadratic classifiers.

∗Member of the IEEE. Department of Computer Science, University of Concepción, Edmundo Larenas 215, Concepción, 4030000, Chile. Phone: +56 41 220-4305, Fax: +56 41 222-1770. E-mail: [email protected]. Partially supported by the Chilean National Fund for Scientific and Technological Development, FONDECYT grant No. 1060904.

†Institute of Informatics, National University of San Juan, Cereceto y Meglioli, San Juan, 5400, Argentina. E-mail: [email protected]
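As a concrete illustration of the quantity the criterion maximizes, the following is a minimal sketch (not the authors' implementation) of the two-class Chernoff distance between Gaussian class-conditional densities, and its value after a linear map x → Ax. The function names and the choice of NumPy are assumptions for illustration; the formula is the standard Chernoff distance between two normal distributions with mixing parameter s.

```python
import numpy as np

def chernoff_distance(m1, S1, m2, S2, s=0.5):
    """Chernoff distance between N(m1, S1) and N(m2, S2)
    for a mixing parameter s in (0, 1); s = 0.5 gives the
    Bhattacharyya distance as a special case."""
    Sw = s * S1 + (1 - s) * S2                     # weighted covariance
    diff = m1 - m2
    # quadratic (mean-separation) term
    term1 = 0.5 * s * (1 - s) * diff @ np.linalg.solve(Sw, diff)
    # log-determinant (covariance-mismatch) term
    term2 = 0.5 * np.log(
        np.linalg.det(Sw)
        / (np.linalg.det(S1) ** s * np.linalg.det(S2) ** (1 - s))
    )
    return term1 + term2

def chernoff_in_subspace(A, m1, S1, m2, S2, s=0.5):
    """Chernoff distance after the linear transformation x -> A x,
    where A is a d x D matrix (d < D): means map to A m_i and
    covariances to A S_i A^T."""
    return chernoff_distance(A @ m1, A @ S1 @ A.T,
                             A @ m2, A @ S2 @ A.T, s)
```

The paper's criterion seeks the matrix A that maximizes this transformed-space distance; a gradient-based ascent over the entries of A (as described in the abstract) would repeatedly evaluate and differentiate `chernoff_in_subspace` with respect to A.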
Similar Resources
A New Linear Dimensionality Reduction Technique Based on Chernoff Distance
A new linear dimensionality reduction (LDR) technique for pattern classification and machine learning is presented, which, though linear, aims at maximizing the Chernoff distance in the transformed space. The corresponding two-class criterion, which is maximized via a gradient-based algorithm, is presented and initialization procedures are also discussed. Empirical results of this and tradition...
On the Performance of Chernoff-Distance-Based Linear Dimensionality Reduction Techniques
We present a performance analysis of three linear dimensionality reduction techniques: Fisher’s discriminant analysis (FDA), and two methods introduced recently based on the Chernoff distance between two distributions, the Loog and Duin (LD) method, which aims to maximize a criterion derived from the Chernoff distance in the original space, and the one introduced by Rueda and Herrera (RH), whic...
Improving Chernoff criterion for classification by using the filled function
Linear discriminant analysis is a well-known matrix-based dimensionality reduction method. It is a supervised feature extraction method used in two-class classification problems. However, it is incapable of dealing with data in which classes have unequal covariance matrices. To address this issue, the Chernoff distance is an appropriate criterion to measure distances between distributions. In the p...
A Theoretical Comparison of Two Linear Dimensionality Reduction Techniques
A theoretical analysis for comparing two linear dimensionality reduction (LDR) techniques, namely Fisher's discriminant (FD) and Loog-Duin (LD) dimensionality reduction, is presented. The necessary and sufficient conditions for which FD and LD provide the same linear transformation are discussed and proved. To derive these conditions, it is first shown that the two criteria preserve the same ma...
A theoretical comparison of two-class Fisher's and heteroscedastic linear dimensionality reduction schemes
We present a theoretical analysis for comparing two linear dimensionality reduction (LDR) techniques for two classes, a homoscedastic LDR scheme, Fisher’s discriminant (FD), and a heteroscedastic LDR scheme, Loog-Duin (LD). We formalize the necessary and sufficient conditions for which the FD and LD criteria are maximized for the same linear transformation. To derive these conditions, we first ...
Journal:
- Pattern Recognition
Volume 41, Issue -
Pages: -
Publication date: 2008